Preconditioned All-at-once Methods for Large, Sparse Parameter Estimation Problems

Authors

  • E Haber
  • U M Ascher
Abstract

The problem of recovering a parameter function based on measurements of solutions of a system of partial differential equations in several space variables leads to a number of computational challenges. Upon discretization of a regularized formulation, a large, sparse constrained optimization problem is obtained. Typically in the literature, the constraints are eliminated and the resulting unconstrained formulation is solved by some variant of Newton's method, usually the Gauss-Newton method. A preconditioned conjugate gradient algorithm is then applied at each iteration to the resulting reduced Hessian system. In this paper we instead apply a preconditioned Krylov method directly to the KKT system arising from a Newton-type method for the constrained formulation (an "all-at-once" approach). A variant of symmetric QMR is employed, and an effective preconditioner is obtained by solving the reduced Hessian system approximately. Since even forming a matrix-vector product with the reduced Hessian is already expensive, the savings from solving this system only approximately are substantial. The resulting preconditioner may be viewed as an incomplete block-LU decomposition, and we obtain conditions guaranteeing bounds on the condition number of the preconditioned matrix. Numerical experiments are performed for the DC-resistivity and magnetostatic problems in 3D, comparing the two approaches for solving the linear system at each Gauss-Newton iteration. A substantial efficiency gain is demonstrated. The relative efficiency of our proposed method is even higher in the context of inexact Newton-type methods, where the linear system at each iteration is solved less accurately.


Related articles

Large-scale Inversion of Magnetic Data Using Golub-Kahan Bidiagonalization with Truncated Generalized Cross Validation for Regularization Parameter Estimation

In this paper a fast method for large-scale sparse inversion of magnetic data is considered. The L1-norm stabilizer is used to generate models with sharp and distinct interfaces. To deal with the non-linearity introduced by the L1-norm, a model-space iteratively reweighted least squares algorithm is used. The original model matrix is factorized using the Golub-Kahan bidiagonalization that proje...


Preconditioned Generalized Minimal Residual Method for Solving Fractional Advection-Diffusion Equation

Introduction Fractional differential equations (FDEs) have attracted much attention and have been widely used in the fields of finance, physics, image processing, and biology, etc. It is not always possible to find an analytical solution for such equations. The approximate solution or numerical scheme may be a good approach, particularly, the schemes in numerical linear algebra for solving ...


A Parametric Simplex Approach to Statistical Learning Problems

In this paper, we show that the parametric simplex method is an efficient algorithm for solving various statistical learning problems that can be written as linear programs parametrized by a so-called regularization parameter. The parametric simplex method offers significant advantages over other methods: (1) it finds the complete solution path for all values of the regularization parameter by ...


Computing Covariance Matrices for Constrained Nonlinear Large Scale Parameter Estimation Problems Using Krylov Subspace Methods

In the paper we show how, based on the preconditioned Krylov subspace methods, to compute the covariance matrix of parameter estimates, which is crucial for efficient methods of optimum experimental design. Mathematics Subject Classification (2000). Primary 65K10; Secondary 15A09, 65F30.


Mammalian Eye Gene Expression Using Support Vector Regression to Evaluate a Strategy for Detecting Human Eye Disease

Background and purpose: Machine learning is a class of modern and strong tools that can solve many important problems that nowadays humans may be faced with. Support vector regression (SVR) is a way to build a regression model which is an incredible member of the machine learning family. SVR has been proven to be an effective tool in real-value function estimation. As a supervised-learning appr...



Journal:

Volume   Issue

Pages  -

Published 2000